Learning Dynamic Latent Spaces for Lifelong Generative Modelling
Authors
Abstract
Task-Free Continual Learning (TFCL) aims to capture novel concepts from non-stationary data streams without forgetting previously learned knowledge. Mixture models, which add new components when certain conditions are met, have shown promising results in TFCL tasks. However, such approaches do not make use of the knowledge already accumulated for positive transfer. In this paper, we develop a new model, namely the Online Recursive Variational Autoencoder (ORVAE). ORVAE utilizes prior knowledge by selectively incorporating newly learnt information, through adding new components, according to what is already known from past data. We introduce an attention mechanism to regularize the structural latent space, so that the most important information is reused while information that interferes with novel samples is inactivated. The proposed attention mechanism can maximize the benefit of forward transfer learning. We perform several experiments which show that ORVAE achieves state-of-the-art results under TFCL.
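The abstract describes two ideas: a mixture of generative components that grows when incoming data is poorly modelled, and an attention-style gate over the latent space that reuses useful latent units while inactivating interfering ones. The sketch below is only an illustrative interpretation of that description, not the paper's actual ORVAE algorithm; the expansion threshold, network sizes, and the sigmoid gate over latent dimensions are all assumptions made for the example.

```python
# Illustrative sketch only: a dynamic mixture of small VAE components with an
# attention-style gate over latent dimensions. The expansion rule and all
# hyperparameters are hypothetical, chosen to make the idea concrete.
import torch
import torch.nn as nn
import torch.nn.functional as F

INPUT_DIM = 784
LATENT_DIM = 8
ADD_THRESHOLD = 120.0  # hypothetical per-batch novelty criterion


class VAEComponent(nn.Module):
    """One mixture component: encoder/decoder pair plus a learnable gate
    deciding which latent units are reused and which are inactivated."""

    def __init__(self):
        super().__init__()
        self.enc = nn.Linear(INPUT_DIM, 2 * LATENT_DIM)  # outputs mu, log_var
        self.dec = nn.Linear(LATENT_DIM, INPUT_DIM)
        self.gate_logits = nn.Parameter(torch.zeros(LATENT_DIM))

    def forward(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)
        gate = torch.sigmoid(self.gate_logits)           # soft reuse/inactivate
        recon = self.dec(z * gate)
        rec = F.mse_loss(recon, x, reduction="none").sum(-1)
        kl = -0.5 * (1 + log_var - mu.pow(2) - log_var.exp()).sum(-1)
        return rec + kl                                   # per-sample loss


class DynamicMixture(nn.Module):
    """Routes each sample to its best-fitting component; grows when none fit."""

    def __init__(self):
        super().__init__()
        self.components = nn.ModuleList([VAEComponent()])

    def forward(self, x):
        losses = torch.stack([c(x) for c in self.components], dim=0)  # (K, B)
        best, _ = losses.min(dim=0)
        return best.mean()

    def maybe_expand(self, batch_loss):
        # Hypothetical rule: add a component when the current stream segment
        # is not modelled well enough by any existing component.
        if batch_loss > ADD_THRESHOLD:
            self.components.append(VAEComponent())
            return True
        return False


if __name__ == "__main__":
    model = DynamicMixture()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(5):                                    # toy data stream
        batch = torch.randn(16, INPUT_DIM)
        loss = model(batch)
        opt.zero_grad()
        loss.backward()
        opt.step()
        if model.maybe_expand(loss.item()):
            # Rebuild the optimizer so the new component's weights are trained.
            opt = torch.optim.Adam(model.parameters(), lr=1e-3)
```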
Related Resources
Lifelong Learning in Costly Feature Spaces
An important long-term goal in machine learning systems is to build learning agents that, like humans, can learn many tasks over their lifetime, and moreover use information from these tasks to improve their ability to do so efficiently. In this work, our goal is to provide new theoretical insights into the potential of this paradigm. In particular, we propose a lifelong learning framework that...
Lifelong Generative Modeling
Lifelong learning is the problem of learning multiple consecutive tasks in a sequential manner where knowledge gained from previous tasks is retained and used for future learning. It is essential towards the development of intelligent machines that can adapt to their surroundings. In this work we focus on a lifelong learning approach to generative modeling where we continuously incorporate newl...
Lifelong Learning for Lifelong Employment
SOMEONE ASKED ME recently, “How do we keep 40-year-old software developers employed?” At first I was puzzled. I had little clue this was a problem. Isn’t there more demand than supply for software developers? However, imagine a software developer who graduates from a good engineering school and gets a good job in a large high-tech company. He marries and raises a family, is good at barbecue, runs...
Semantically Decomposing the Latent Spaces of Generative Adversarial Networks
We propose a new algorithm for training generative adversarial networks that jointly learns latent codes for both identities (e.g. individual humans) and observations (e.g. specific photographs). By fixing the identity portion of the latent codes, we can generate diverse images of the same subject, and by fixing the observation portion, we can traverse the manifold of subjects while maintaining...
Latent semantic analysis as a tool for learner positioning in learning networks for lifelong learning
As we move towards distributed, self-organized learning networks for lifelong learning to which multiple providers contribute content, there is a need to develop new techniques to determine where learners can be positioned in these networks. Positioning requires us to map characteristics of the learner onto characteristics of learning materials and curricula. Considering the nature of the netwo...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i9.26291